Multi-Kernel Gaussian Processes

Authors

  • Arman Melkumyan
  • Fabio Tozeto Ramos
Abstract

Although Gaussian process inference is usually formulated for a single output, in many machine learning problems the objective is to infer multiple tasks jointly, possibly exploiting the dependencies between them to improve results. A real-world example is ore mining, where the objective is to infer the concentrations of several chemical components to assess ore quality. Similarly, robotics and control problems typically involve more than one actuator, and understanding and accurately modeling the dependencies between the control outputs can significantly improve the controller.
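The joint inference described above can be sketched with an intrinsic coregionalization model (ICM), one standard way to build a multi-output GP covariance. This is an illustrative assumption, not the kernel construction of this paper: an RBF base kernel over inputs is combined with a coregionalization matrix `B` over tasks via a Kronecker product, and predictions for both tasks come from a single joint posterior.

```python
import numpy as np

# Hedged sketch: an ICM multi-output GP for two correlated tasks.
# Assumptions (not from the paper): RBF base kernel, hand-picked
# coregionalization matrix B, near-noiseless observations with jitter.

def rbf(x1, x2, lengthscale=1.0):
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def icm_kernel(x1, x2, B, lengthscale=1.0):
    # Joint covariance over (task, input) pairs: B kron K_x.
    return np.kron(B, rbf(x1, x2, lengthscale))

# Shared training inputs; task 2 is a shifted copy of task 1.
x = np.linspace(0.0, 5.0, 20)
y = np.concatenate([np.sin(x), np.sin(x) + 0.1])

B = np.array([[1.0, 0.9],
              [0.9, 1.0]])            # strong inter-task correlation
K = icm_kernel(x, x, B) + 1e-6 * np.eye(2 * len(x))

# Joint posterior mean at new points, for both tasks at once.
xs = np.linspace(0.0, 5.0, 7)
Ks = icm_kernel(xs, x, B)             # cross-covariance, shape (14, 40)
mean = Ks @ np.linalg.solve(K, y)     # stacked: task 1 first, then task 2
```

Because the two tasks share one covariance, observations of either output inform predictions of both, which is the benefit the abstract points to.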


Related articles

GP Kernels for Cross-Spectrum Analysis

Multi-output Gaussian processes provide a convenient framework for multi-task problems. An illustrative and motivating example of a multi-task problem is multi-region electrophysiological time-series data, where experimentalists are interested in both power and phase coherence between channels. Recently, Wilson and Adams (2013) proposed the spectral mixture (SM) kernel to model the spectral den...
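The spectral mixture (SM) kernel mentioned above has a compact closed form in one dimension: a mixture of Q Gaussians in the spectral domain becomes, in the input domain, Gaussian envelopes multiplied by cosines. A minimal sketch of that 1-D form (parameter names are illustrative):

```python
import numpy as np

# Hedged sketch of the 1-D spectral mixture kernel of Wilson & Adams (2013):
# k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 v_q) * cos(2 pi tau mu_q),
# where w_q are weights, mu_q spectral means, v_q spectral variances.

def sm_kernel(x1, x2, weights, means, variances):
    tau = x1[:, None] - x2[None, :]          # pairwise lags
    k = np.zeros_like(tau, dtype=float)
    for w, mu, v in zip(weights, means, variances):
        k += w * np.exp(-2.0 * np.pi**2 * tau**2 * v) * np.cos(2.0 * np.pi * tau * mu)
    return k

x = np.linspace(0.0, 1.0, 5)
# One component with zero spectral mean reduces to an RBF-like kernel.
K = sm_kernel(x, x, weights=[1.0], means=[0.0], variances=[0.25])
```

Nonzero spectral means `mu_q` introduce the periodic structure that makes the kernel suited to power and phase analysis across channels.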


Multi-class Classification with Dependent Gaussian Processes

We present a novel multi-output Gaussian process model for multi-class classification. We build on the formulation of Gaussian processes via convolution of white Gaussian noise processes with a parameterized kernel and present a new class of multi-output covariance functions. The latter allow for greater flexibility in modelling relationships between outputs while being parsimonious with regard...


Asymmetric kernel in Gaussian Processes for learning target variance

This work incorporates the multi-modality of the data distribution into a Gaussian Process regression model. We approach the problem from a discriminative perspective by learning, jointly over the training data, the target space variance in the neighborhood of a certain sample through metric learning. We start by using data centers rather than all training samples. Subsequently, each center sel...


Product Kernel Interpolation for Scalable Gaussian Processes

Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM ba...
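The MVM-based iterative inference the snippet refers to can be illustrated with conjugate gradients, which solves the linear system behind the GP predictive mean while touching the kernel matrix only through matrix-vector products. In this sketch the MVM is a plain dense product; SKI-style methods would substitute a fast structured operator for `mvm`.

```python
import numpy as np

# Hedged sketch: conjugate gradients for K v = y using only MVMs.
def cg_solve(mvm, y, tol=1e-8, max_iter=200):
    v = np.zeros_like(y)
    r = y - mvm(v)                 # initial residual
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = mvm(p)
        alpha = rs / (p @ Ap)
        v += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:  # converged
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return v

# Small SPD system standing in for a kernel matrix plus noise.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 30))
K = A @ A.T + 30.0 * np.eye(30)
y = rng.standard_normal(30)
v = cg_solve(lambda u: K @ u, y)
```

Since each iteration costs one MVM, any structure that accelerates the MVM (Kronecker, Toeplitz, interpolation grids) accelerates the whole inference.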


Stochastic Variational Deep Kernel Learning

Deep kernel learning combines the non-parametric flexibility of kernel methods with the inductive biases of deep learning architectures. We propose a novel deep kernel learning model and stochastic variational inference procedure which generalizes deep kernel learning approaches to enable classification, multi-task learning, additive covariance structures, and stochastic gradient training. Spec...


Tensor Regression Meets Gaussian Processes

Low-rank tensor regression, a new model class that learns high-order correlation from data, has recently received considerable attention. At the same time, Gaussian processes (GP) are well-studied machine learning models for structure learning. In this paper, we demonstrate interesting connections between the two, especially for multi-way data analysis. We show that low-rank tensor regression i...




Publication date: 2011